Blog posts tagged with 'how to boost your seo with google adwords'

Duplicate Content Filter: What it is and how it works - Tuesday, September 06, 2011

Duplicate Content has become a huge topic of discussion lately, thanks to the new filters that search engines have implemented. This article will help you understand why you might be caught in the filter, and ways to avoid it. We'll also show you how you can determine if your pages have duplicate content, and what to do to fix it.

Search engine spam is any deceitful attempt to deliberately trick the search engine into returning inappropriate, redundant, or poor-quality search results. This behavior is often seen in pages that are exact replicas of other pages, created to gain better placement in the search engine. Many people assume that creating multiple or similar copies of the same page will either increase their chances of getting listed in search engines or help them get multiple listings, due to the presence of more keywords.

In order to make searches more relevant to users, search engines use a filter that removes duplicate content pages from the search results, and the spam along with it. Unfortunately, good, hardworking webmasters have fallen prey to the filters search engines impose to remove duplicate content. They are the webmasters who unknowingly spam the search engines, even though there are things they can do to avoid being filtered out. For you to truly understand the concepts you can implement to avoid the duplicate content filter, you first need to know how this filter works.

First, we must understand that the term "duplicate content penalty" is actually a misnomer. When we refer to penalties in search engine rankings, we are talking about points that are deducted from a page in order to arrive at an overall relevancy score. In reality, duplicate content pages are not penalized. Rather, they are simply filtered, the way you would use a sieve to remove unwanted particles. Sometimes, "good particles" are accidentally filtered out.

Knowing the difference between the filter and the penalty, you can now understand how a search engine determines what duplicate content is. There are basically four types of duplicate content that are filtered out:

  1. Websites with Identical Pages - Pages that are exact copies of one another are considered duplicates, and websites that are identical to another website on the Internet are also considered spam. Affiliate sites with the same look and feel which contain identical content, for example, are especially vulnerable to a duplicate content filter. Another example would be a website with doorway pages. Many times, these doorways are skewed versions of landing pages, but those landing pages are identical to other landing pages. Generally, doorway pages are intended to spam the search engines in order to manipulate search engine results.
  2. Scraped Content - Scraped content is content taken from a web site and repackaged to make it look different, but in essence it is nothing more than a duplicate page. With the popularity of blogs on the internet and the syndication of those blogs, scraping is becoming more of a problem for search engines.
  3. E-Commerce Product Descriptions - Many eCommerce sites use the manufacturer's descriptions for their products, descriptions which hundreds or thousands of other eCommerce stores in the same competitive markets are using too. This duplicate content, while harder to spot, is still considered spam.
  4. Distribution of Articles - If you publish an article, and it gets copied and put all over the Internet, this is good, right? Not necessarily for all the sites that feature the same article. This type of duplicate content can be tricky, because even though Yahoo and MSN determine the source of the original article and deem it most relevant in search results, other search engines like Google may not, according to some experts.

So, how does a search engine's duplicate content filter work? Essentially, when a search engine robot crawls a website, it reads the pages and stores the information in its database. Then it compares its findings to the other information in its database. Depending upon a few factors, such as the overall relevancy score of a website, it determines which pages are duplicate content, and then filters out the pages or the websites that qualify as spam. Unfortunately, if your pages are not spam but have enough similar content, they may still be regarded as spam.

There are several things you can do to avoid the duplicate content filter. First, you must be able to check your pages for duplicate content. Our Similar Page Checker will help: enter the URLs of two pages and the tool will compare them and point out where they are similar, so that you can make them as unique as possible.

Since you need to know which sites might have copied your site or pages, you will need some help. We recommend using a tool that searches for copies of your page on the Internet: www.copyscape.com. Here, you can put in your web page URL to find replicas of your page on the Internet. This can help you create unique content, or even address the issue of someone "borrowing" your content without your permission.

Let's look at the issue of some search engines possibly not considering the source of the original content from distributed articles. Remember, some search engines, like Google, use link popularity to determine the most relevant results. Continue to build your link popularity, while using tools like www.copyscape.com to find out how many other sites have the same article; if the author allows it, you may be able to alter the article so as to make the content unique.

If you use distributed articles for your content, consider how relevant the article is to your overall web page and then to the site as a whole. Sometimes, simply adding your own commentary to the articles can be enough to avoid the duplicate content filter; the Similar Page Checker could help you make your content unique. Further, the more relevant articles you can add to complement the first article, the better. Search engines look at the entire web page and its relationship to the whole site, so as long as you aren't exactly copying someone's pages, you should be fine.

If you have an eCommerce site, you should write original descriptions for your products. This can be hard to do if you have many products, but it really is necessary if you wish to avoid the duplicate content filter. Here's another example of why using the Similar Page Checker is a great idea. It can show you how to change your descriptions so as to have unique and original content for your site. This approach works well for scraped content too. Many scraped content sites offer news. With the Similar Page Checker, you can easily determine where the news content is similar, and then change it to make it unique.

Do not rely on an affiliate site which is identical to other sites, and do not create identical doorway pages. Not only are these types of pages filtered out immediately as spam, but generally, once another site or page is found to be a duplicate, the page is no longer weighed in relation to the site as a whole, and that can get your entire site in trouble.

The duplicate content filter is sometimes hard on sites that don't intend to spam the search engines. But it is ultimately up to you to help the search engines determine that your site is as unique as possible. By using the tools in this article to eliminate as much duplicate content as you can, you'll help keep your site original and fresh.

Comments (0)
What is Robots.txt - Tuesday, September 06, 2011

Robots.txt

It is great when search engines frequently visit your site and index your content, but often there are cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling, otherwise you risk a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will also prefer that search engines do not index these pages (although in this case the only sure way to keep sensitive data from being indexed is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and javascript from indexing, you also need a way to tell spiders to keep away from these items.

One way to tell search engines which files and folders on your Web site to avoid is with the use of the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines of your wishes is to use a robots.txt file.
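
For reference, here is what a Robots metatag looks like when placed in the head section of a page – a minimal sketch using the commonly supported noindex and nofollow values:

<META NAME="Robots" CONTENT="noindex, nofollow" />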

What Is Robots.txt?

Robots.txt is a text (not html) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall, or a kind of password protection); putting up a robots.txt file is something like putting a note “Please, do not enter” on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.

The concept and structure of robots.txt was developed more than a decade ago and if you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will continue with the structure of a robots.txt file.

Structure of a Robots.txt File

The structure of a robots.txt is pretty simple (and barely flexible) – it is an endless list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:

Disallow:

“User-agent” names the search engines' crawlers and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.

User-agent: *

Disallow: /temp/
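
Building on the same syntax, here is a hedged sketch covering the cases mentioned at the start of this article – printable versions, images, stylesheets and scripts – assuming they live in directories named /print/, /images/, /css/ and /js/ (adjust the paths to match your own site):

# Keep printable pages, images, stylesheets and scripts out of the index.
User-agent: *
Disallow: /print/
Disallow: /images/
Disallow: /css/
Disallow: /js/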

The Traps of a Robots.txt File

When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start, if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos are misspelled user-agents, directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find but in some cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *

Disallow: /temp/

User-agent: Googlebot

Disallow: /images/

Disallow: /temp/

Disallow: /cgi-bin/

The above example is from a robots.txt that allows all agents to access everything on the site except the /temp directory. Up to here it is fine but later on there is another record that specifies more restrictive terms for Googlebot. When Googlebot starts reading robots.txt, it will see that all user agents (including Googlebot itself) are allowed to all folders except /temp/. This is enough for Googlebot to know, so it will not read the file to the end and will index everything except /temp/ - including /images/ and /cgi-bin/, which you think you have told it not to touch. You see, the structure of a robots.txt file is simple but still serious mistakes can be made easily.
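
Following the logic described above, a safer way to write the same file – a minimal sketch – is to put the more restrictive Googlebot record first, so that Googlebot reads its own rules before reaching the catch-all record:

# Rules for Googlebot come first.
User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/

# All other user agents.
User-agent: *
Disallow: /temp/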

Tools to Generate and Validate a Robots.txt File

Given the simple syntax of a robots.txt file, you can always read it yourself to see if everything is OK, but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtml. These tools report common mistakes like missing slashes or colons, which, if not detected, compromise your efforts. For instance, if you have typed:

User agent: *

Disallow: /temp/

this is wrong because the hyphen between “User” and “agent” is missing and the syntax is incorrect.

In those cases when you have a complex robots.txt file – i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry – there are tools that will generate the file for you. What is more, there are visual tools that allow you to point and select which files and folders are to be excluded. But even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much help, unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is better than nothing.

Comments (0)
Your Website from Google Banned to Google Unbanned - Tuesday, September 06, 2011

Even if you are not looking for trouble and do not violate any known Google SEO rule, you still might have to experience the ultimate SEO nightmare - being excluded from Google’s index. Although Google is something of a monopolist among search engines, it is not a bully company that excludes innocent victims for pure pleasure. Google holds rigorously to SEO best practices and excludes sites that misbehave.

If you own and run a blog or website, being listed by Google is a very important step toward having it read by as many people as possible; but what if your website gets Google banned? If this has happened to you, then you know that it hurts your site because you won’t show up in the Google search engine, and that means less traffic to your site. Getting unbanned from Google is a long and drawn-out process. And sometimes Google won’t even tell you the reason they banned your website in the first place, which doesn’t make things any easier.

Some of the ways a site can be Google banned include having spam on it, putting in too many keywords that clog up your site, making the URLs you own redirect to each other, improperly inserting a robots.txt file, duplicating your own pages and sending people to them over and over, and linking to bad sites like those with adult content, gambling or other unauthorized areas. There are multiple other reasons, so it’s a good idea to try to get Google to let you know the reason for being banned. That will make it much simpler to fix the problem. Over-optimization has many faces and you can have a look at the Optimization, Over-Optimization or SEO Overkill? article to get some ideas of practices that you should avoid.

Here are the necessary steps that you need to follow in order to get Google reconsideration and get unbanned. Be sure to follow the Google reconsideration request process precisely and correctly if you want to get your website unbanned and get your site back in business providing whatever products or services it has:

1 Send a Google Reconsideration Request for Getting Unbanned

Getting Google reinclusion of your website requires putting in a Google reconsideration request. First, the way you know your site is Google banned is that it suddenly no longer has a page rank. Then, in order to determine for sure that this is the case, enter your site as www.yoursite.com into Google, using the name of your site instead of the words yoursite. If you don’t see any of your pages there, then it’s likely you were Google banned.

Another way to tell if you are truly Google banned is to see if your pages show up in page indexing on Google. Or, if it is a news blog then you can go to www.googlenews.com and if you don’t see your articles there, you will also know you were probably banned from Google and now need to send a Google reconsideration request.

2 Be Polite to Google

Next, remember that you are sending your Google reconsideration request to a real person who works for Google; someone at a Google office will actually read your request to be unbanned. Therefore you want to be polite and go into as much detail as possible, as it is better to give too much information than not enough in this situation. Being nice counts here, and if you act like a jerk, it’s likely no one will want to help you.

3 Provide Information about the Domain

List things such as whether it is a brand new domain name, give them some background about your website, and also tell them the rules you think you may have broken. If there have been spam clicks on your account, gather the proof and write to them about it. This shows them you are serious about resolving the problem when sending the Google reconsideration request. Put down everything that you think someone would need to know in order to know who you are and to jog their memory on why you were banned in the first place. Be sure to do your research so you will understand what is going on and can fully explain it to the Google representatives when sending the reconsideration request to Google.

4 Explain the Solution to the Past Problem

While sending the reconsideration request to Google, tell the representative what you have already done to fix the problem that caused you to be banned. Spell it out in detail and give them your actual page URLs to prove it. It’s best to give as much information and data as you can so they will understand what you did to solve the issue. For example, if your site linked to bad links, then you must make sure that you remove every one of those links. Be sure to have removed all spam, or anything else that Google doesn’t approve of or like. Then, prove to Google that you did this by showing them the evidence. Or, if you had invalid clicks, which is one of the common reasons to get Google banned, show why the clicks were valid. It takes this sort of detailed information to make them understand the situation and help you resolve it. Also, ensure that the changes now made to your website meet the requirements for Google reinclusion. Don’t do a single thing on your website that might annoy them.

5 Verify the Website

Next, log in to your Google webmaster account and add and verify your site. Then go to http://www.google.com/webmasters/tools/reconsideration. This is the area that you use to put in your reconsideration request to Google to be unbanned. You can also send the information in an email to help@google.com, where Google representatives give support to customers. You may also have to sign up for Google webmaster tools once you are logged into your account, if you don’t already have it.

6 Provide Proof

It’s never a good idea to be an idiot and try to blame Google, or to claim you had no idea what you did wrong. You need real proof for Google reinclusion, not just blame or acting stupid. Show them the proof of the changes you made for Google reconsideration. And lastly, always be considerate and thank them for the time and effort they are taking to look into your reconsideration request and to help you solve the problems. Getting unbanned from Google means your site can be relisted and you can keep getting the traffic you need to run your business, blog or news site.

7 Be Patient

It can take several weeks for a Google representative to get back to you and answer your Google reinclusion request. They have a lot of other things to handle and you need to understand that you aren’t the only one who may be having issues. While you are waiting, continue to look over your site and try to make sure all the alleged violations are fixed and good to go.

8 Send Follow Up Email for Google reconsideration

Be sure to send follow-up emails to Google to ask how the request is going and if they know when the situation will be resolved. You probably shouldn’t send one every day, because this could be regarded as you being a pest, but be sure to send one in periodically until you get an answer that you understand and can deal with to solve the Google banned problem.

All in all, it can be a time-consuming and complicated process to move your site from Google banned to Google unbanned, but with the proper preparation and information, you should be well on your way to being in their good graces again. It’s well worth your effort, so just follow these steps and Google should get back to you and fix your situation and your site.

Comments (0)
Optimizing for MSN - Tuesday, September 06, 2011

SEO experts often forget that there are three major search engines. While there is no doubt that Google is number one with the most searches and Yahoo! manages to get about a quarter of the market, MSN has not retired yet. It holds about 10-15 percent of searches (according to some sources even less – about 5%), but it has a loyal audience that can't be reached through the other two major search engines, so if you plan a professional SEO campaign, you can't afford to skip MSN. In a sense, getting high rankings in MSN is similar to getting high rankings for less popular keywords – because competition is not that tough, MSN alone might bring you enough visitors, with far less effort than it takes to rank in a more popular search engine.

Although optimizing for MSN is different from optimizing for Google and Yahoo!, there are still common rules that will help you rank high in any search engine. As a rule, if you rank well in Google, chances are that you will rank well in Yahoo! (if you are interested in the tips and tricks of optimizing for Yahoo!, have a look at the Optimizing for Yahoo! article) and MSN as well. The opposite is not true, however. If you rank well in MSN, there is no guarantee that you'll do the same in Google. So, when you optimize for MSN, keep an eye on your Google ranking as well. It's no good to top MSN and be nowhere in Google (the opposite is more acceptable, if you need to make the choice).

But why is this so? The answer is simple - the MSN algorithm is different and that is why, even if the same pages were indexed, the search results will vary.

The MSN Algorithm

As already mentioned, it is the different MSN algorithm that leads to such drastically different rankings. Otherwise MSN, like all search engines, first spiders the pages on the Web, then indexes them in its database, and after that applies its algorithm to generate the pages with the search results. So, the first step in optimizing for MSN is the same as for the other search engines – to have a spiderable site. (Have a look at the Search Engine Spider Simulator to see how spiders see your site.) If your site is not spiderable, then you don't have even a hypothetical chance of topping the search results.

There is quite a lot of speculation about the MSN algorithm. Looking at the search results MSN delivers, it is obvious that its search algorithm is not as sophisticated as Google's, or even Yahoo!'s, and many SEO experts agree that the MSN search algorithm is years behind its competitors. So, what can you do in this case? Optimize as you did for Google a couple of years ago? You are not far from the truth, though actually it is not that simple.

One of the most important differences is that MSN still relies heavily on metatags, as explained below. None of the other major search engines uses metatags that heavily anymore. It is obvious that metatags give SEO experts a great opportunity for manipulating search results. Maybe metatags are the main reason for the inaccurate search results that MSN often produces.

The second most important difference between MSN and the other major search engines is their approach to keywords. For MSN, keywords are very, very important too, but unlike Google, for MSN onpage factors are dominant, while offpage factors (like backlinks, for example) are still of minor importance. It is a safe bet that the importance of backlinks will change in the future, but for now they are not a primary factor for high rankings.

Keywords, Keywords, Keywords

It is hardly surprising that keywords are the most important item for MSN. What is surprising is that MSN relies too much on them. It is very easy to fool MSN – just artificially inflate your keyword density, put a couple of keywords in file names (and even better – in domain names) and around the top of the page, and you are almost done for MSN. But if you follow the above-mentioned black hat practices, your joy of topping MSN will not last long because, unless you provide separate pages that are optimized for Google, your stuffed pages might pretty well get you banned from Google. And if you decide to have separate pages for Google and MSN, first, it is hardly worth the trouble, and second, the risk of a duplicate content penalty can't be ignored.

So, what is the catch? The catch is that if you try to polish your site for MSN and stuff it with keywords, this might get you into trouble with Google, which certainly is worse than not ranking well in MSN. But if you optimize wisely, it is more likely than not that you will rank decently in Google and perform well in Yahoo! and MSN as well.

Metatags

Having meaningful metatags never hurts, but with MSN it is even more important because its algorithm still uses them as a primary factor in calculating search results. Well-written (not stuffed) metatags will help you with MSN and some other minor search engines, and at the same time will not get you banned from Google.

The Description metatag is very important:

<META NAME="Description" CONTENT="Place your description here" />

MSNBot reads its content and, based on that (in addition to keywords found on the page), judges how to classify your site. So if you leave this tag empty (i.e. CONTENT=""), you have missed a vital chance to be noticed by MSN. There is no evidence that MSN uses the other metatags in its algorithm, which is why leaving the Description metatag empty is even more unforgivable.
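
For example, a filled-in Description metatag for a hypothetical store selling handmade wallets might look like this (the wording is purely illustrative – write a description that actually matches your page):

<META NAME="Description" CONTENT="Handmade leather wallets and belts, with free shipping on orders over $50." />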

Comments (0)
How to get Traffic from Social Bookmarking sites - Tuesday, September 06, 2011

Sites like digg.com, reddit.com, stumbleupon.com etc. can bring you a LOT of traffic. How about getting 20,000 or more visitors a day when your listing hits the front page?
Getting to the front page of these sites is not as difficult as it seems. I have been successful with digg and del.icio.us (and not so much with Reddit though the same steps should apply to it as well) multiple times and have thus compiled a list of steps that have helped me succeed:


 

1 Pay attention to your Headlines

Many great articles go unnoticed on social bookmarking sites because their headline is not catchy enough. Your headline is the first (and very often the only) thing users will see from your article, so if you don't make the effort to provide a catchy headline, your chances of getting to the front page are small.
Here are some examples to start with:

Original headline: The Two Types of Cognition
Modified headline: Learn to Understand Your Own Intelligence

Original headline: Neat way to organize and find anything in your purse instantly!
Modified headline: How to Instantly Find Anything in Your Purse

Here is a good blog post that should help you with your headlines.

2 Write a meaningful & short description

The headline is very important to draw attention, but if you want to keep that attention, a meaningful description is vital. The description should be slightly provocative because this draws more attention, but never use lies and false facts to provoke interest. For instance, if you write “This article will reveal to you the 10 sure ways to deal with stress once and forever and live like a king from now on,” visitors will hardly think that your story is true and fact-based.

You also might be tempted to use a long tell-it-all paragraph to describe your great masterpiece but have in mind that many users will not bother to read anything over 100-150 characters. Additionally, some of the social bookmarking sites limit descriptions, so you'd better think in advance how to describe your article as briefly as possible.

3 Have a great first paragraph

This is a rule that is always true, but for successful social bookmarking it is even more important. If you have successfully passed Level 1 (headlines) and Level 2 (description) in the Catch the User's Attention game, don't let a bad first paragraph make them leave your site.

4 Content is king

However, the first paragraph is not everything. Going further along the chain of drawing (and retaining) users' attention, we reach the Content is King level. If your articles are just trash, bookmarking them is useless. You might cheat users once, but don't count on repeat visits. What is more, you can get your site banned from social bookmarking sites if you persistently post junk.

5 Make it easy for others to vote / bookmark your site

It is best when other people, not you, bookmark your site. Therefore, you must do your best to make it easy for them. You can put a bookmarking button at the end of the article, so that if users like your content, they can easily post it. If you are using a CMS, check if there is an extension that allows you to add Digg, Del.icio.us, and other buttons, but if you are using static HTML, you can always go to the social bookmarking site and copy the code that will add their button to your pages.
Here is a link that should help you add Links for Del.icio.us, Digg, and More to your pages.
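
As a rough illustration, such a button is often nothing more than a link that passes your page's URL and title to the bookmarking service. The submit URL patterns below are only indicative and may change, and example.com stands in for your own domain – check each service's own instructions for the current format:

<a href="http://digg.com/submit?url=http://www.example.com/my-article&amp;title=My+Article">Digg this</a>
<a href="http://del.icio.us/post?url=http://www.example.com/my-article&amp;title=My+Article">Bookmark on del.icio.us</a>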

6 Know when to submit

The time when you submit can be crucial for your attempts to get to the front page. On most social bookmarking sites you have only 24 hours to get to the front page and stay there. So, if you post when most users (and especially your supporters) are still sleeping, you are wasting valuable time. By the time they get up, you might have gone to the tenth page. You'd better try it for yourself and see if it works for you but generally posting earlier than 10 a.m. US Central Time is not good. Many people say that they get more traffic around 3 p.m. US Central Time. Also, workdays are generally better in terms of traffic but the downside is that you have more competitors for the front page than on weekends.

7 Submit to the right category

Sometimes a site might not work for you because there is no right category for your content. Or because you don't submit to the right category – technology, health, whatever – but to categories like General, Miscellaneous, etc., where all unclassified stuff goes. And since these categories fill up very fast, your chance of getting noticed decreases.

8 Build a top-profile

Not all users are equal on social bookmarking sites. If you are an old and respected user who has posted tons of interesting stuff, this increases the probability that what you submit will get noticed. Posting links to interesting articles on other sites is vital for building a top-profile. Additionally, it looks suspicious when your profile has links to only one site. Many social bookmarking sites frown upon users submitting their own content, because this feels like self-promotion.

9 Cooperate with other social bookmarkers

The lone wolf approach is a suicidal strategy on sites like StumbleUpon, Digg and Netscape. Many stories make it to the front page not only because they are great but because they are backed up by a network of friends. If in the first hours after your submission you get at least 15 votes from your friends and supporters, it is more likely that other users will vote for you. 50 votes can get you to the top page of Digg.

10 Submit in English

Linguistic diversity is great, but the majority of users are from English-speaking countries and they don't understand exotic languages. So, for most of the social bookmarking sites, submitting anything in a language other than English is not recommended. The languages that are at a particular disadvantage are Chinese, Arabic, Slavic languages and all the others that use a non-Latin alphabet. German, Spanish and French are more understandable, but still they are not English. If you really must submit your story (i.e. because you need the backlink), include an English translation of at least the title. But the best way to proceed with non-English stories is to post them where they belong. Check this link for a list of non-English sites.

11 Never submit old news

Submitting old news will not help you become a respected user. Yesterday's news is history. But if you still need to submit old stuff, consider feature articles, how-tos and similar pieces that stay up-to-date for a long time.

12 Check your facts

You may be flattered that users read your postings, but you will hardly be flattered when users prove that you haven't got the facts right. In addition to sarcastic comments, you might also receive negative votes for your story, so if you want to avoid this, check your facts - or your readers will do it for you.

13 Check your spelling

Some sites do not allow you to edit your posts later, so if you misspell the title, the URL, or a keyword, it will stay that way forever.

14 Not all topics do well

But sometimes even great content and submitting to the right category do not push you to the top. One possible reason could be that your stories are about unpopular topics. Many sites have topics that their users love and topics that don't sell that well. For instance, Apple sells well on Digg and the war in Iraq on Netscape. Negative stories – about George Bush, Microsoft, evil multinational companies, corruption and crime – also have a chance to make it to the front page. You can't know these things in advance, but some research on how many stories tagged with keywords like yours have made the front page in the last year or so can give you a clue.

15 Have Related Articles / Popular Articles

Traffic gurus joke that traffic from social bookmarking sites is like an invasion – the crowds pour in and in a day or two they are gone. Unfortunately this is true – after your listing rolls off the front page (provided that you reached the front page), the drop in traffic is considerable. Besides, many users come just by following the link to your article, have a look at it and then they are gone. One of the ways to keep them longer on your site is to have links to Related Articles / Popular Articles or something similar that can draw their attention to other stuff on the site and make them read more than one article.

16 RSS feeds, newsletter subscriptions, affiliate marketing

RSS feeds, newsletter subscriptions and affiliate marketing are all areas in which the traffic from social bookmarking sites can help you a lot. Many people who come to your site and like it will subscribe to your RSS feeds and/or your newsletter. So, put these in visible places and you will be astonished at the number of new subscriptions you get on the day you are on the front page of a major social bookmarking site.

17 Do not use automated submitters

After some time of active social bookmarking, you will discover that you are spending hours on end posting links. Yes, this is a lot of time and using automated submitters might look like the solution but it isn't. Automated submitters often have malware in them or are used for stealing passwords, so unless you don't care about the fate of your profile and don't mind being banned, automated submitters are not the way to go.

18 Respond to comments on your stories

Social bookmarking sites are not newsgroups, but interesting articles can trigger a pretty heated discussion with hundreds of comments. If your article gets comments, you should be proud. Always respond to comments on your stories and, even better, post comments on other stories you find interesting. This is a way to make friends and to build a top-profile.

19 Prepare your server for the expected traffic

This is hardly a point of minor importance, but we take it for granted that you are hosting your site on a reliable server that does not crash twice a day. Have in mind, though, that your presence on the front page of a major social bookmarking site can drive you a lot of traffic, which can cause your server to crash – literally!
I remember one of the times I was on the front page of Digg, I kept restarting Apache on my dedicated server because it was unable to cope with the massive traffic. I have many tools on my site and when visitors tried them, this loaded the server additionally.
For an articles-only site, getting so much traffic is not so devastating, but if you are hosting on a so-so server, you'd better migrate your site to a machine that can handle a lot of simultaneous hits. Also, check if your monthly traffic allowance is enough to handle 200-500,000 or even more visitors. It is very amateurish to attract a lot of visitors and not be able to serve them because your server crashed or you have exceeded your bandwidth!

20 The snowball effect

But despite the differences in the likes of the different social bookmarking communities, there are striking similarities. You will soon discover that if a post is popular on one of the major sites, this usually drives it up on the other big and smaller sites. Usually it is Digg posts that become popular on StumbleUpon and Reddit but there are many other examples. To use this fact to your best advantage, you may want to concentrate your efforts on getting to the front page of the major players only and bet on the snowball effect to drive you to the top on other sites.
An additional benefit of the snowball effect is that if your posting is interesting and people start blogging about it, you can get tons of backlinks from their blogs. This happened to me and the result was that my PR jumped to 6 on the next update.

Comments (0)
Choosing SEO as Your Career - Tuesday, September 06, 2011

It's always better to know in advance what you can expect from a career in SEO.

Some Good Reasons to Choose SEO as Your Career

1 High demand for SEO services

Once, SEO was not a separate profession – Web masters performed some basic SEO for the sites they managed and that was all. But as sites began to grow and make money, it became more reasonable to hire a dedicated SEO specialist than to have the Web master do it. The demand for good SEO experts is high and is constantly on the rise.

2 A LOT of people have made a successful SEO career

There is plenty of living proof that SEO is a viable business. The list is too long to be quoted here, but some of the names include Rob from Blackwood Productions, Jill Wahlen from High Rankings, Rand Fishkin from SEO Moz and many others.

3 Search Engine Optimizers make Good Money!

SEO is a profession that can be practiced while working for a company or as a solo practitioner. There are many job boards like Dice and Craigslist that publish SEO job advertisements. It is worth noting that the compensation for SEO employees is equal to or even higher than that of developers, designers and marketers. Salaries over $80K per annum are not an exception for SEO jobs.
As a solo SEO practitioner you can make even more money. Almost all freelance sites have sections for SEO services, and offers for $50 an hour or more are quite common. If you are still not confident that you can work on your own, you can start with an SEO job, learn a bit and then start your own company.
If you already feel confident that you know a lot about SEO, you can take this quiz and see how you score. Don't get depressed if you didn't pass – here is a great checklist that will teach you a lot, even if you are already familiar with SEO.

4 Only Web Designing MAY NOT be enough

Many companies offer turn-key solutions that include Web design, Web development AND SEO optimization. In fact, many clients expect that when they hire somebody to make their site, the site will be SEO friendly, so if you are good both as a designer and as an SEO expert, you will be a truly valuable professional.
On the other hand, many other companies deal with SEO only, because they feel that this way they can concentrate their efforts on their major strength – SEO – so you can consider this possibility as well.

5 Logical step ahead if you come from marketing or advertising

The Web has changed the way companies do business, so to some extent today's marketers and advertisers need to have at least some SEO knowledge if they want to be successful. SEO is also a great career for linguists.

6 Lots of Learning

For somebody who comes from design, development or web administration, SEO might not look technical enough and you might feel that moving to SEO would be a downgrade. Don't worry so much – you can learn a LOT from SEO, so if you are a talented techie, you are not downgrading; you are actually upgrading your skills package.

7 SEO is already recognized as a career

Finally, if you need some more proof that SEO is a great career, have a look at the available SEO courses and exams for SEO practitioners. They might not be a Cisco certification, but they still help to institutionalize the SEO profession.

Some Ugly Aspects of SEO

1 Dependent on search engines

It is true that in any career there are many things that are outside of your control, but for SEO this is rule number one. Search engines frequently change their algorithms and, what is worse, these changes are not made public, so even the greatest SEO gurus admit that they make a lot of educated guesses about how things work. It is very discouraging to make everything perfect and then to learn that, due to a change in the algorithm, your sites dropped 100 positions down. But the worst part is that you need to communicate this to clients, who are not satisfied with their sinking rankings.

2 No fixed rules

Probably this will change over time, but for now the rule is that there are no rules – or at least no written ones. You can work very hard, follow everything that looks like a rule, and still success may not come. Currently you can't even rely on taking a search engine to court over the damage done to your business, because search engines are not obliged to rank highly the sites that have made efforts to get optimized.

3 Rapid changes in rankings

But even if you somehow manage to get to the top for a particular keyword, keeping the position requires constant effort. Many other businesses are like that, so this is hardly a reason to complain – except when an angry customer starts shouting at you that this week their rankings are sinking and of course this is all your fault.

4 SEO requires Patience

The SEO professional and the customer both need to understand that SEO takes constant effort and time. It could take months to move ahead in the rankings, or to build tens of links. Additionally, if you stop optimizing for some time, most likely you will experience a considerable drop in rankings. You need lots of motivation and patience not to give up when things are not going your way.

5 Black hat SEO

Black hat SEO is probably one of the biggest concerns for the would-be SEO practitioner. Fraud and unfair competition are present in any industry, and those who are good and ethical suffer from this, but black hat SEO is still pretty widespread. It is true that search engines penalize black hat practices, yet black hat SEO remains a major concern for the industry.

So, let's hope that by telling you about the pros and cons of choosing SEO as your career we have helped you make an informed decision about your future.

Comments (0)
SEO Careers during a Recession - Tuesday, September 06, 2011

I don't know whether many people became SEO experts because they planned ahead and figured that SEO careers are relatively stable in the long run, especially when compared to other business areas, or whether the reasons to make a career in SEO were completely different, but my feeling is that SEO experts are lucky now. Why? Because while the recession makes many industries writhe in pain, many SEO professionals are in top financial shape and full of optimism for the future.

It would be an exaggeration to say that the SEO industry doesn't feel the recession. This is not exactly so, but when compared to industries such as automobiles, newspapers, banking, real estate, etc., SEO looks like a coveted island of financial security. This doesn't mean that there is no drop in volumes or that everybody in SEO is working for top dollar, but as a whole the SEO industry, and the separate individuals who make their living in SEO, are doing much better than many other employees and entrepreneurs.

What Can You Expect from Your SEO Career During a Recession?

The question of what realistic expectations are is fundamental. I bet there are people in SEO who are not very happy with their current situation and blame the recession for that. Well, if most of your clients were from troubled industries (cars, real estate, financial services, etc.), then you do have a reason to complain. In such cases you should be happy if you can pay the bills. What you can do (if you haven't already done it) is look for new customers from other industries.

Another factor that influences your expectations about your SEO career during the recession is your position on the career ladder. It makes a big difference whether you work for a company or you are your own boss. Being an employee has always been a more vulnerable position, so if you expect job security, this is easier to achieve when you are an independent SEO contractor. Mass layoffs might not be common for SEO companies, but hired workers are never immune to them.

Additionally, your skill level also affects how your SEO career will be influenced by the recession. The recession is not the right time for novices to enter SEO. Many people from other industries rush to SEO as a life belt. When these people don't have the right skills and expertise but expect rivers of gold, this inevitably leads to disappointment.

What Makes SEO Careers Recession-Proof?

So, if you are a seasoned SEO practitioner and you don't dream of rivers of gold, you can feel safe with SEO because unlike careers in many other industries SEO careers are relatively recession-proof. Here are some of the reasons why SEO careers are recession-proof:

  • The SEO market is an established market. If you remember the previous recession from the beginning of the century, when the IT industry was among the most heavily stricken, you might be a bit skeptical that this time won't be the same story. But it is not the same now. SEO is no longer a new service, and the SEO market itself is more established than it was a couple of years ago. This is what makes the present recession different from the previous one – the difference is fundamental and it can't be neglected.

  • SEO is one of the last expenses companies cut. SEO has already become a necessity for companies of any size. Unlike hardware, cars, not to mention entertainment and even business trips, SEO expenses are usually not that big, but they help a company stay afloat. That is why when a company decides to make cuts in the budget, SEO expenses are usually not among the things that get the largest cut (or any cut at all).

  • SEO has great ROI. The Return On Investment (ROI) for money spent on SEO is much higher than the ROI for other types of investments. SEO brings companies money and this is what makes it such a great investment. Stop SEO and the money stops coming as well.

  • Many clients start aggressive SEO campaigns in an attempt to get better results fast. During a recession SEO is even more important. That is why many clients decide that an aggressive SEO campaign will help them get more clients and as a result these clients double their pre-recession budgets.

  • SEO is cheaper than PPC. SEO is just one of the many ways for a site to get traffic. However, it is also one of the most effective ways to drive tons of traffic. For instance, if you consider PPC, the cost advantages of SEO are obvious. PPC is very expensive and as a rule, ranking high in organic search results even for competitive keywords is cheaper than PPC.

  • Cheaper than traditional promotion methods. Traditional promotion methods (i.e. offline marketing) are still an option, but their costs are higher than even PPC and the other forms of online promotion. Besides, many companies have given up offline marketing completely and have turned to SEO as their major way to promote their business and attract new clients.

  • SEO is a recurring expense. Many businesses build their business model around memberships and other forms of recurring payments. For you, memberships and other types of recurring payments are presold campaigns – i.e. more or less you know that if the client is happy with a campaign you did for them, they will return. Acquiring recurring clients is very beneficial because you have fewer expenses in comparison to acquiring clients one by one.

The outlook for SEO careers during times of recession is pretty positive. As we already mentioned, it is possible to experience drops in volume, or some of your clients may go down the bankruptcy road, but as a whole SEO offers more stability than many other careers. If you manage to take advantage of the above-mentioned recession-proof specifics of SEO and you are a real professional, you won't have to feel the recession in all its bitterness.

Comments (0)
HTML 5 and SEO - Tuesday, September 06, 2011

HTML 5 is still in the making, but for any SEO expert who tries to look ahead, some knowledge about HTML 5 and how it will impact SEO is far from unnecessary. It is true that the changes and the new concepts in HTML 5 will impact Web developers and designers much more than SEO experts, but it would be far from the truth to say that HTML 5 will not mean changes in organic SEO practice.

What's New in HTML 5?

HTML 5 follows the way the Net has evolved in the last years and includes many useful tags and elements. At first glance, it might look as if HTML 5 is going in the direction of a programming language (e.g. PHP), but actually this is not so – it is still a markup language for presenting content, not a programming language. The new tags and elements might make HTML 5 look more complex, but this is only at first glance.

HTML 5 is not very different from HTML 4. One of the basic ideas in the development of HTML 5 was to ensure backward compatibility and because of that HTML 5 is not a complete revamp of the HTML specification. So, if you had worries that you will have to start learning it from scratch, these worries are groundless.

How the Changes in HTML 5 Will Affect SEO?

As an SEO expert, you are most likely interested mainly in those changes in the HTML 5 specification that will affect your work. Here are some of them:

  • Improved page segmentation. Search engines are getting smarter and there are many reasons to believe that even now they are applying page segmentation. Basically, page segmentation means that a page is divided into several separate parts (i.e. main content, menus, headers, footers, links sections, etc.) and these parts are treated as separate entries. At present, there is no way for a Web master to tell search engines how to segment a page but this is bound to change in HTML 5.

  • A new <article> tag. The new <article> tag is probably the best addition from an SEO point of view. The <article> tag allows you to mark separate entries in an online publication, such as a blog or a magazine. It is expected that when articles are marked with the <article> tag, this will make the HTML code cleaner because it will reduce the need for <div> tags. Also, search engines will probably put more weight on the text inside the <article> tag as compared to the contents of the other parts of the page.

  • A new <section> tag. The new <section> tag can be used to identify separate sections of a page, chapter or book. The advantage is that each section can have its own HTML heading. As with the <article> tag, it can be presumed that search engines will pay more attention to the contents of separate sections. For instance, if the words of a search string are found in one section, this implies higher relevance than when these words are found scattered across the page or in separate sections.

  • A new <header> tag. The new <header> tag (which is different from the head element) is a blessing for SEO experts because it gives a lot of flexibility. The <header> tag is very similar to the <H1> tag, but the difference is that it can contain a lot of stuff, such as H1, H2 and H3 elements, whole paragraphs of text, hard-coded links (and this is really precious for SEO), and any other kind of info you feel is relevant to include.

  • A new <footer> tag. The <footer> tag might not be as useful as the <header> one, but it still allows you to include important information there, and it can be used for SEO purposes as well. The <header> and <footer> tags can be used many times on one page – i.e. you can have a separate header/footer for each section, and this gives really a lot of flexibility.

  • A new <nav> tag. Navigation is one of the important factors for SEO and everything that eases navigation is welcome. The new <nav> tag can be used to identify a collection of links to other pages.

As you see, the new tags follow the common structure of a standard page, and each of the parts (i.e. header, footer, main section) has a separate tag. The tags described here are just some (but certainly not all) of the new tags in HTML 5 that will affect SEO in some way. For instance, the <audio>, <video> and <dialogue> tags are also part of the HTML 5 standard, and they will allow content to be further separated into the appropriate categories. There are many other tags, but they are of relatively lower importance and that is why they are not discussed here.
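
To make this more concrete, here is a minimal sketch of how these structural tags might fit together on a blog-style page (the element names come from the HTML 5 drafts discussed above; the sample content is purely illustrative):

<header>
  <h1>My SEO Blog</h1>
</header>
<nav>
  <a href="/">Home</a> <a href="/archive">Archive</a>
</nav>
<section>
  <article>
    <header><h2>Optimizing for MSN</h2></header>
    <p>The main text of the entry goes here...</p>
    <footer>Entry footer: author, tags, related links.</footer>
  </article>
</section>
<footer>
  <p>Site-wide footer: copyright notice and secondary links.</p>
</footer>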

For now, HTML 5 is still in the future. When more pages become HTML 5-compliant, search engines will pay more attention to HTML 5. Only then will it be possible to know how exactly search engines will treat HTML 5 pages. The mass adoption of HTML 5 won't happen soon, and it is a safe bet to say that for now you can stick to HTML 4 and have no concerns. Additionally, it will take some time for browsers to adjust to HTML 5, which further delays the moment when HTML 5 will be everywhere.

However, once HTML 5 is accepted and put to use, it will be the dominant standard for years to come, and that is why you might want to keep an eye on what other web masters are doing, just to make sure that you will not miss the moment when HTML 5 becomes the de facto standard.

Comments (0)
How to get traffic from Twitter - Tuesday, September 06, 2011

Twitter is one of the latest and greatest Web 2.0 apps and it gets tons of traffic. However, from the point of view of an SEO expert, it is more important that Twitter can get you tons of traffic as well. So, if you still don't have an account with Twitter, you'd better open one.

Twitter is simple to use and this is what made it so popular. Twitter is fashionable right now, so enjoy the moment. Even the creators of Twitter admit that as with MySpace and other Web 2.0 sensations, Twitter will inevitably go out of fashion some day, so hurry up and get some traffic for free now, when it is still all the rage of the season.

Twitter is simple to use, yet it is really powerful. You might need a couple of hours to get familiar with the basic functionality of Twitter and of some of the extras it has but you can harness its power, even if you don't know it very well.

Unlike most of the other places you can get traffic for free, Twitter is a microblogging platform, which means that there are restrictions on the number of characters in a message. Therefore, you need to be concise in your Tweets and use your space wisely. In addition to being concise, here are some more tips to help you get traffic from Twitter:

1 Make your Twitter profile interesting

Your profile and your username are the first two things your visitors will see when they go to your Twitter page. If your profile looks boring, people won't bother to read your tweets, not to mention visit the links you post in them. You can't write a very long bio, but you can enter a few words about yourself – e.g. your occupation, your interests, etc. You can also include a couple of keywords in your bio.

2 Pick a niche-targeted username

Your username is also very important. You need to pick a username that is targeted at your niche. For instance, if you are promoting your SEO services and want to drive traffic to your SEO site, you can choose something like SEOmaster, SEOguru, SEOservices, etc. Your username will show in searches other users make and this is why you must pay attention to what you choose.

3 Put your site/blog URL in your profile

According to some statistics, 80% of tweeters don't provide a URL in their bio! Well, maybe these people are not SEO experts/Internet marketers and don't need this traffic, but you as a SEO expert can't afford to miss it. So, don't forget to include your URL in your profile!

4 Send the link to your profile to your friends, coworkers, and acquaintances

Your friends, coworkers, and acquaintances will be your most loyal audience, so if they don't know about your Twitter page, make them aware of what they are missing. If you have their emails, or know their accounts on other networks, you can send a mass invite.

5 Search for Twitter users with similar interests

You might have millions of friends, but more followers are always welcome. That is why you can use the search functions on Twitter and find people with similar interests. Find as many as you can and invite them all. These people might not be as loyal as your friends, coworkers, and acquaintances but still you will get hits from them as well. Some Twitter users report that about 1-2% of their followers visit their site a day, which means that if you have 1,000 followers, you might expect to get at least 10 or 20 visits a day to your site. This response rate might seem low but there are ways to increase it.

6 Socialize on Twitter as much as you can

When you are active on Twitter – responding to your followers' posts and visiting their links – this seriously increases the chances that they will do the same in return. In a word, actively follow those who follow you.

7 Tweet regularly

As with all other kinds of media, if you want to keep your audience, you need to feed it regularly. Writing a short tweet takes just seconds, but it is enough to keep your followers happy. It goes without saying that you should tweet about useful things, so if you don't have something meaningful to post about you or your sites, it is quite OK to post a link to an article, a video, a blog, etc. that you found on the Net and liked.

8 Don't spam

You might feel that every single user on Twitter is interested in you and your blog/site, but this is not exactly so. You might be tempted to make as many users as you can aware of your Twitter page and your latest tweets, but you'd better refrain from doing this unless you want to risk getting banned.

9 Take advantage of Twitterfeed

Twitterfeed is one more useful service you can take advantage of in order to increase your reach. Go to twitterfeed.com and configure your feeds.

10 Make Twitter Search love you

Twitter has a great search function and its main advantage is that it offers real-time results. Google might be fast in indexing pages but its indexing is not real-time. Users are hungry for hot news and nothing beats a real-time search. Many bloggers report that they are getting more traffic from Twitter than from Google and partially this is due to the fact that their tweets are popular and users find them with ease.

11 Add Twitter gadgets to your site

There are tons of Twitter gadgets and new ones are being released every day. The cool thing about Twitter gadgets is that your blog visitors can become your Twitter followers. If your Twitter followers have many followers, chances are that some of these followers will notice you and will join your network. As we already mentioned, building a large and targeted network is key to getting more Twitter traffic to your site.

These are some of the main ways in which you can get traffic from Twitter. If you are creative and if you monitor what's going on on Twitter and what new Twitter gadgets are released, you will certainly find more ways to drive traffic from Twitter to your site.

Comments (0)
YouTube Traffic - Tuesday, September 06, 2011

YouTube is one of the most popular sites and, in addition to all the fun there, it offers many opportunities for promotion and for getting traffic to your site. Similarly to Facebook and Twitter, in order to use YouTube successfully for promotion and traffic, you need to know the rules. Here are some tips on how to promote yourself, your site, and your products and how to get free traffic from YouTube:

    1 Post viral videos

    There are millions of videos on YouTube. If you post a video nobody is interested in, it will go unnoticed, like millions of other videos. The key to getting traffic from YouTube is to post useful videos, or even better – viral videos. Viral videos are not only useful; they also tend to appeal to large groups of people. If your video goes viral, people will promote it for you and the only thing left for you is to reap the benefits.

    2 Create an interesting profile

    Similarly to Facebook, Twitter, or any other social networking site, an interesting profile is a must. If people like your videos, they will check your profile to learn more about you. If they see that your profile is boring, they won't bother with you any further. You can make your profile a bit informal, but don't make it look like the profile of a crazy teenager – you are using YouTube for business, right?

    3 Include your logo and website in the video

    Your logo and your website URL are your major branding weapons. This is why you must include them in the video. You can include them in the beginning of the video or at the end. It is best to have your logo and URL throughout the whole video because this way you will be gaining lots of exposure but if you can't do it (for instance because of artistic considerations), the beginning and the end of the video will suffice.

    4 Post quality videos

    As already mentioned, there is no shortage of videos on YouTube. Unfortunately, this also means there is no shortage of videos with poor quality. These videos are not favored by viewers, so if you want viewers to watch your videos, make sure that your videos don't have crappy sound and/or blurred pictures. YouTube is not a board for professional videographers, so you can post amateur videos, but make sure their quality is decent.

    5 Promote your videos

    If your videos go viral, you are lucky, but you can't count on this. In order to get YouTube traffic, your videos need viewers. You can't rely solely on viewers finding your videos on their own – you need to promote them. Even viral videos benefit from some promotion on your part.

    6 Make your videos search-friendly

    One of the ways viewers find your videos is through search – both locally on YouTube and on search engines. This is why you need to make your videos search-friendly. To do this, include your major keywords in the title and in the description. Also, pay special attention to the tags: list as many relevant keywords as you can, but be careful not to get spammy.

    7 Post in series

    Standalone videos can become a hit but it is best if you create series of videos and post them once a day/week. This way viewers will know that there will be more and they will be coming to check. Even if you don't create series, at least try to post videos regularly – this builds audience loyalty.

    8 Post video responses

    Video responses are one of the unique things about YouTube and you should take full advantage of them. Search your niche, choose the most popular videos in it, and post video responses to them. Just make sure the response you post is related to the video you are responding to, and don't turn your video response into blatant self-promotion.

    9 Choose the right time to post your videos

    On YouTube, timing is very important because there are peaks in traffic and times when there are not so many viewers. Weekday mornings or early afternoons US time (especially Wednesdays and, above all, Thursdays) are the best time to post a general-interest video. In order to have your video uploaded in prime time, you need to plan a bit. Keep in mind that for large videos and/or slow Internet connections the upload could take an hour, so start early.

    10 Keep your videos short

    YouTube doesn't impose limits on the length of the videos it publishes, but generally long videos are boring. 3 to 5 minutes is the best duration for a video, though if required you could go from 1 to 6 minutes. When a video is longer than 6 or 7 minutes, it gets boring and not many people will watch it to the end (where your logo and URL are to be found). 3 to 5 minutes is enough to lay out your idea, give some details AND tell viewers to visit your site for more.

    11 Comment on other people's videos and include a link to your site in your comment

    In addition to video responses, you can also use plain good comments. Again, search for popular videos in your niche and comment on them. If your comments are liked by viewers, they will check your profile and probably watch your videos.

    YouTube is a valuable resource to drive traffic to your site and to promote it. The competition there might be fierce, but there is always room for a couple of good videos. Fill this room before your competitors do!

Comments (0)
How to Pick an SEO Friendly Designer - Monday, September 05, 2011
It is very important to hire a SEO-friendly designer because if you don't and your site is designed in a SEO-unfriendly fashion, you can't compensate for this later. This article will tell you how to pick a SEO-friendly designer and save yourself the disappointment of low rankings with search engines.

A Web designer is one of the people without whom it is not possible to create a site. However, where SEO is concerned, Web designers can be really painful to deal with. While there are many Web designers who are SEO-proficient, it is still not uncommon to stumble upon design geniuses who focus only on the graphic aspect of the site. For them SEO is none of their business, and they couldn't care less about something as unimportant as good rankings with search engines. Needless to say, if you hire such a designer, don't expect your site to rank well with search engines.

If you will be doing SEO on your own, you might not care much about the SEO skills of your Web designer, but as we'll see next, there are design issues that can affect your rankings very badly. If the designer builds the site against SEO rules, it is not possible to fix this later with SEO tricks.

When we say that you need to hire a SEO-friendly designer, we presume that you are a SEO pro and you know SEO but if you aren't, then have a look at the SEO Tutorial and the SEO Checklist. If you have no idea about SEO, then you will hardly be able to select a SEO-friendly designer because you won't know what to look for.

One of the ultimate tests of whether a designer is SEO-friendly is to look at his or her past sites – are they done professionally, especially in the SEO department? If those past sites don't exhibit blatant SEO mistakes (such as the ones we'll list in a second) and they rank well, this is a good sign that the person is worth hiring. Still, after you look at past sites, ask the designer whether he or she also did the SEO for them, because in some cases the client may have done a lot of the optimization, and this is why the site ranks well.

Here is a checklist of common web design sins that will make your site a SEO disaster. If you notice any or all of the following in the past sites your would-be designer has created, just move to the next designer. These SEO-unfriendly design elements are absolute sins and unless the client made them do it, no designer who would use the below techniques deserves your attention:

1 Rely heavily on Flash

Many designers still believe that Flash is the best thing since sliced bread. While Flash can be very artistic and make a site look cool (and load forever in the browser), heavily Flash-ed sites are a disaster in terms of SEO. Simple HTML sites rank better with search engines and, as we point out in Optimizing Flash Sites, if the use of Flash is a must, then an HTML version of the same page is mandatory.

2 No internal links, or very few links

Internal links are backlinks and they are very important. Of course, this doesn't mean that all the text on a page must be hyperlinked to all the other pages on the site, but if there are only a couple of internal links per page, this is a missed chance to get backlinks.

3 Images, not text for anchors

This is another frequent mistake many designers make. Anchor text is vital in SEO, and when your links lack anchor text, this is bad. It is true that for menu items and other page elements it is much easier to use an image than text, because with text you can never be sure it will display exactly as intended on users' screens, but since this hurts your site's rankings, you should sacrifice beauty for functionality.
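
A minimal illustration of the difference between the two approaches; the URL, file name, and anchor text are placeholders:

    <!-- Image-only link: no anchor text for the search engine to read
         (the URL and file name are placeholders) -->
    <a href="/services.html"><img src="btn-services.png" alt="Services"></a>

    <!-- Text link: the anchor text itself carries the keyword -->
    <a href="/services.html">SEO services</a>

The alt attribute on the image helps a little, but a plain text anchor gives search engines much clearer information about the target page.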

4 Messy code and tons of code

If you have no idea about HTML, then it might be impossible for you to judge if a site's code is messy and if the amount of code is excessive but cleanness of code is an important criterion for SEO. When the code is messy, it might not be spiderable at all and this can literally exclude your site from search engines because they won't be able to index it.

5 Excessive use of (SEO non-friendly) JavaScript

Similarly to Flash, search engines don't love JavaScript, especially tons of it. Actually, the worst thing about JavaScript is that, if it is not coded properly, it is quite possible that your pages (or parts of them) will not be spiderable, which automatically means they won't be indexed.

6 Overoptimized sites

Overoptimized sites aren't better than under-optimized. In fact, they could be much worse because when you keyword stuff and use other techniques (even when they are not Black Hat SEO) to artificially inflate the rankings of the site, this could get you banned from search engines and this is the worst that can happen to a site.

7 Dynamic and other SEO non-friendly URLs

Well, maybe dynamic URLs are not exactly a design issue, but if you are getting a turn-key site – i.e. it is not up to you to upload and configure it and to create the internal links – then dynamic URLs are bad and you have to ask the designer/developer not to use them. You can rewrite dynamic and other SEO non-friendly URLs on your own, but in practice this means making dramatic changes to the site, which is hardly the point of hiring a designer.
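
If you do end up rewriting dynamic URLs yourself on an Apache server, a minimal mod_rewrite sketch might look like this; the script name and URL pattern are placeholders for your own setup:

    # .htaccess sketch: serve a keyword-friendly URL from the underlying dynamic script.
    # "product.php" and the /products/... pattern are illustrative, not a fixed convention.
    RewriteEngine On
    RewriteRule ^products/([a-z0-9-]+)/?$ product.php?item=$1 [L]

This way visitors and search engines see /products/blue-widget/ while the site still runs on the same dynamic script behind the scenes.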

These points are very important and this is why you need to follow them when you are choosing a SEO-friendly designer. Some of the items on the list are so bad for SEO (i.e. Flash, JavaScript) that even if the site is a design masterpiece and you promote it heavily, you will still be unable to get decent rankings. SEO-friendliness of design is a necessity, not a whim, and you shouldn't settle for a SEO-unfriendly design – this can be really expensive!

Comments (0)
How to Optimize your Website for Mobile Search - Monday, September 05, 2011
Mobile search is different from desktop search and if you have lots of mobile visitors, you need to make your site mobile-friendly. Shorter keywords, shorter pages, current info, and compliance with mobile standards are some of the key points to follow in order to make your site suitable for mobile searchers.

It is not only Web designers and developers who need to adapt to the mobile Web – SEO experts also have to make changes to their strategies and tactics if they want to capture the lucrative mobile search market. Mobile search is a constantly growing segment of the market, which is good news. However, mobile search has its own rules, and they are somewhat different from the rules of traditional desktop search. This is why, if you don't want to miss mobile searchers, you need to adapt to their requirements. Here are some very important rules to consider when optimizing for mobile search:

1 Mobile Searchers Use Shorter Key phrases/Keywords

Mobile users search for shorter keyphrases, or even just for single keywords. Even mobile devices with QWERTY keyboards are awkward for typing long texts, which is why mobile searchers are usually very brief in their search queries. Very often the query is limited to only two or even one word. As a result, if you don't rank well for shorter keyphrases (which, unfortunately, are also more competitive), you will be missing a lot of mobile traffic.

2 Mobile Search Is Mainly Local Search

Mobile users search mostly for local stuff. In addition to shorter search keyphrases, mobile searchers are also locally targeted. It is easy to understand - when a user is standing in the street and is looking for a place to dine, he or she is most likely looking for things in the neighborhood, not in another corner of the world. Searches like “pizza 5th Avenue” are quite popular, which makes local search results even more important to concentrate on.

3 Current Data Rules in Mobile Search

Sports results, news, weather, financial information are among the most popular mobile search categories. The main topics and niches mobile users prefer are kind of limited but again, they revolve around places to eat or shop in the area, sports results, news, weather conditions, market information, and other similar topics where timing and location are key. If your site is in one of these niches, then you really need to optimize it because if your site is not mobile-friendly chances are you are losing visitors. You could even consider having two separate versions of your site – one for desktop searchers and one for mobile searchers.

4 In Mobile Search, Top 10 Is Actually Top 3

Users hate to scroll down long search pages or hit Next, Next, Next. Desktop searchers aren't fond of scrolling endless pages either but in mobile search the limitations are even more severe. A page with 10 search results fits on the screen of a desktop but on a mobile device it might be split into 2 or more screens. Therefore, in mobile search, it is not Top 10, it is more Top 4, or even Top 3 because only the first 3 or 4 positions are on the first page and have a higher chance to attract the user's attention without having to go to the next page.

5 Promote Your Mobile-Friendly Site

Submit your site to major mobile search engines, mobile portals, and directories. It is great if your visitors come from Google and the other major search engines but if you want to get even more traffic, mobile search engines, mobile portals, and directories are even better. For now these mobile resources work great to bring mobile traffic, so don't neglect them. Very often a mobile user doesn't search with Google, but goes to a portal he or she knows. If your site is listed with this portal, the user will come directly to you from there, not from a search engine. The case with directories is similar – i.e. if you are optimizing the site of a pizza restaurant, then you should submit it to all directories where pizza restaurants and restaurants in general for your location are listed.

6 Follow Mobile Standards

Mobile search standards are somewhat different and if you want your site to be spiderable, you need to comply with them. Check the W3C guidelines to see what the mobile standards are. Even if your site doesn't comply with mobile standards, it will still be listed in search results, but it will be transcoded by the search engine and the result could be pretty shocking to see. Transcoders convert sites to a mobile format, but this is not done in a sophisticated manner and the output might be really unbelievable – and anything but mobile-friendly.
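
Beyond the formal standards, one small step that helps mobile browsers render your pages sensibly (assuming the layout itself is already reasonably mobile-friendly) is to declare a viewport in the head of each page:

    <!-- Tells mobile browsers to use the device's own width instead of a zoomed-out desktop width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">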

7 Don't Forget Meta.txt

Meta.txt is a special file where you briefly describe the contents of your site and point the user agent to the most appropriate version for it. Search engine spiders index the meta.txt file directly (provided it is located in the root directory), so even if the rest of your site is not accessible, you will still be included in search results. Meta.txt is similar to robots.txt in desktop search, but it also has some similarity with metatags because you can put content in it (as you do with the Description and Keywords metatags). The format of the meta.txt file is colon delimited (as is the format of robots.txt): each field has the form <fieldname>:<value>. One of the advantages of meta.txt is that it is easily parsed by both humans and search engines.
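
As a rough idea only, a meta.txt file along those lines might look like the sketch below; the field names and the comment syntax are illustrative assumptions based on the colon-delimited, robots.txt-like format described above, so check the current mobile guidelines for the exact fields your target engines support:

    # Illustrative sketch of a colon-delimited meta.txt file (field names are assumptions)
    Description: Pizza restaurant on 5th Avenue with online ordering
    Keywords: pizza, restaurant, 5th Avenue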

8 No Long Pages for Mobile Searchers

Use shorter texts because mobile users don't have the time to read lengthy pages. We already mentioned that mobile searchers don't like lengthy keyphrases. Well, they like lengthy pages even less! This is why, if you can make a special, shorter mobile version of your site, this would be great. Short pages don't mean that you should skip your keywords, though. Keywords are really vital for mobile search, so don't exclude them but don't keyword stuff, either.

9 Predictive Search Is Popular With Mobile Searchers

Use phrases, which are common in predictive search. Predictive search is also popular with mobile searchers because it saves typing effort. This is why, if your keywords are among the common predictive search results, this seriously increases your chances to be found. It is true that predictive search keywords change from time to time and you can't always follow them but you should at least give it a try.

10 Preview Your Site on Mobile Devices

Always check how your site looks on a mobile device. With the plethora of devices and screen sizes, it is not possible to check your site on absolutely every single device you can think of, but checking it on at least a couple of the most important ones is far better than nothing. Even if you manage to get visitors from mobile search engines, if your site looks distorted on a mobile screen, these visitors will run away. Transcoding is one reason why a site gets distorted, so it is really a good idea to make your site mobile-friendly yourself instead of relying on search engines to transcode it and turn it into a design nightmare in the process.

Mobile search is relatively new but it is a safe bet that it will get a huge boost in the near future. If you are uncertain whether your particular site deserves to be optimized for mobile devices or not, use AdWords Keyword Research Tool to track mobile volumes for your particular keywords. If the volumes are high, or if a particular keyword is doing remarkably well in the mobile search segment, invest more time and effort to optimize for it.

Comments (0)
How to Optimize for Baidu - Monday, September 05, 2011
Baidu is the most popular search engine in China, more popular than Google itself. This is why, if you have visitors from China, it makes sense to optimize your site for Baidu as well. The rules for ranking well with Baidu are similar to the rules of the other search engines, yet there are differences, as we show in the article.

Usually SEO efforts are directed towards achieving top rankings with Google and sometimes with Yahoo and Bing. However, in addition to the Big Three, there are also other search engines that might be of interest to you. In fact, some of these search engines might prove a better option than Google, Yahoo, or Bing. If these search engines are used by your target audience, they will be more efficient and it is worth spending some time optimizing for them.

If you haven't heard about Baidu, don't worry. It is a popular search engine but its reach is not global and this is why many people don't even know about it. Still, Baidu is certainly not just one more search engine to waste your time with. Baidu is big in China and since the population of China is more than a billion, if you rank well with Baidu, this can make quite a difference. In fact, if you are operating globally, not to mention if your visitors are based mainly in China, you can't afford to miss this market. On the Chinese market, the share of Baidu is around 60% and it is the most popular Chinese language search engine. Google is less popular in China than Baidu, so if your traffic comes from the Chinese market, it pays to optimize your site for Baidu.

The algorithm of Baidu is different from the algorithms of Google, Bing, and Yahoo. In a sense it is less sophisticated, and it somewhat resembles the algorithms the other search engines used many years ago. Here are some tips on what you should do in order to get decent rankings with Baidu:

1 Find the right Chinese keywords

Of course, as with other search engines, keywords are important for good rankings. You need to find the right Chinese keywords to optimize for. This might be a challenge because Chinese has many dialects and the same words have different meanings in different dialects. However, Pinyin Chinese is preferred by Baidu and this is why Pinyin Chinese is your best choice. You should stick to it not only for your keywords but for your content as a whole.

2 You need LOTS of content in Chinese

Even if Chinese is not the official language of your site, you need to have many pages in Chinese. With Baidu, content is king – the same as with other search engines. When generating tons of content in Chinese, follow the official guidelines for what content is acceptable in China, because the rules there are strict and if you don't obey them, this could cost you more than just your good rankings with Baidu.

3 Metatags weigh a lot

Similarly to the early days of the other search engines, metatags are very important with Baidu, so don't forget to make your metatags top-notch. However, don't abuse metatags and don't stuff them with keywords.
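
As a quick reminder of what is meant by metatags here, these are the title and meta elements in the head of each page; the text below is placeholder content, and for Baidu it would of course be written in Chinese, as discussed above:

    <!-- Placeholder content; keep the description natural and the keywords unstuffed -->
    <head>
      <title>Main keyword – brand name</title>
      <meta name="description" content="One or two natural sentences that contain the main keyword.">
      <meta name="keywords" content="main keyword, secondary keyword, brand name">
    </head>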

4 Get a Pinyin Chinese domain name and host your site on a Chinese host

Domain names are important with Baidu as well. In addition to having keywords in your domain name, you need to have a domain name in Pinyin Chinese. You can use a .com, .net, or .cn extension with it. For even better results, host your site on a Chinese host because this gives you an additional bonus with Baidu.

5 Use simple navigation structures

Simple navigation structures are a must with every search engine but for Baidu they matter even more. Baidu won't follow links that are deeply buried in all kinds of messy code or that go many levels deep in the site hierarchy.

6 Watch for duplicate content

Baidu is very strict about duplicate content. With the other search engines you might also have problems if you have duplicate content, but Baidu is even less tolerant. Use a robots.txt file to tell the spider what not to index and you are safe.
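
A minimal robots.txt sketch along those lines is shown below; Baiduspider is Baidu's crawler, while the disallowed paths are just placeholders for whatever duplicate sections your own site happens to have:

    # Keep crawlers out of sections that only duplicate existing content
    # (the paths below are placeholders for your own site structure)
    User-agent: Baiduspider
    Disallow: /print/
    Disallow: /archive/copies/

    User-agent: *
    Disallow: /print/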

7 No links to bad neighbors and no link farms

Linking to bad neighbors and getting links from link farms isn't a good idea with any search engine but penalties with Baidu are even more severe, so you need to consider this. Also, don't put too many outbound links on your site because this also affects your Baidu rankings in a negative way.

8 Plan in advance

In Google you can get top rankings within a week (though this certainly isn't the norm, and we don't mean you should use blackhat strategies to achieve it), but with Baidu success doesn't come that fast. With Baidu it can take 6 months or more to achieve the rankings you could achieve in Google overnight, and you need to take this into account. For instance, if you are promoting a summer-related site, you should start optimization no later than November, so that when the season comes, your site will have achieved the rankings you want.

9 Baidu doesn't deal with Flash and JavaScript

Flash and JavaScript aren't Google's favorites, but Baidu absolutely hates them. This is why you should use Flash and JavaScript only if you provide alternative HTML versions of the content you have incorporated in the Flash/JavaScript. Baidu doesn't like iFrames either, so avoid them as well.

10 Make sure your site is spiderable

As any other search engine, Baidu uses crawlers, so make sure your site is spiderable. Use a spider simulator to check what is accessible from your site and what isn't.

As you see, optimization for Baidu isn't totally different from optimization for any other search engine but it certainly has its specifics. Follow the rules, be patient and sooner or later success will come to you. When you are done with your Baidu optimization, it won't be a surprise, if your site starts ranking better with Google as well, especially for country-specific searches. When you have so much content in Chinese and a Chinese domain name, this will inevitably help you to achieve better rankings for your Chinese search terms in any other search engine.

Comments (0)
SEO Musts for Local Business - Monday, September 05, 2011

When you are doing business locally, you need local traffic. Maybe you are asking yourself how this is possible, since search engines are global in nature. Read the article and you will learn what you can do to get targeted local traffic to your site.

The Internet might be global in nature, but if your business is local, it makes no sense to concentrate on global reach, when your customers live in your city, or even in your neighborhood. For local businesses getting a global reach is a waste of resources. Instead, you should concentrate on the local community. You might be asking how you can do it, when the Web is global and Google doesn't classify sites according to their location. Here is how you can go local with SEO:

1 Use your location in your keywords.

The first trick is to use your location in your keywords. For example, if you are in London and you sell car insurance, your most important keyphrase should be “car insurance London” because this keyphrase contains your business and your location and will drive people who are looking for car insurance in London in particular.

2 Use your location in metatags

Metatags matter for search engines, and you shouldn't forget to include your location, together with your other keywords, in the metatags of your pages. Of course, you must also have your location in the keywords you use in the body text, because it looks a bit suspicious when your body text doesn't mention your location at all but your tags are stuffed with it.
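
Sticking with the car insurance example from the previous tip, the head of such a page might look roughly like this; the business name and wording are placeholders:

    <!-- The business name and wording are placeholders -->
    <head>
      <title>Car Insurance London – Example Brokers</title>
      <meta name="description" content="Compare car insurance quotes from brokers based in London.">
      <meta name="keywords" content="car insurance London, London car insurance quotes">
    </head>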

3 Use your location in your body text

Keywords in the body text count a lot and you can't afford to skip them. If your web copy is optimized for “car insurance” only, this won't help you rank well with “car insurance London”, so make sure that your location is part of your keywords.

4 Take advantage of Google Places and Yahoo Local

Google Places and Yahoo Local are great places to submit to because they will include you in their listings for a particular location.

5 Create backlinks with your location as anchor text

It could be a bit tricky to get organic backlinks with your location as anchor text because some keywords with a location don't sound very natural – for instance, “car insurance London” isn't grammatically correct and you will hardly get an organic inline link with it – but you can use it in the Name field when commenting on blogs. If the blog is dofollow, you will still get a backlink with anchor text that helps your SEO.

6 Get included in local search engines

Global search engines, such as Google, Bing, or Yahoo, can bring you lots of traffic, but depending on your location, local search engines might be the real gold mine. A local search engine could mean a search engine for your area (though regional search engines are not very common) or, more likely, for your country. For instance, Baidu is a great option if you are selling on the Chinese market.

7 Get listed in local directories

In addition to local search engines, you need to try your luck with local directories, too. You might think that nobody reads directory listings but this isn't exactly so. For instance, Yellow Pages are one of the first places where people look when searching for a local vendor for a particular product.

8 Run locally-targeted ad campaigns

One of the most efficient ways to drive targeted, local traffic to your site is with the help of locally-targeted ad campaigns. PPC ads and classifieds are the two options that work best – at least for most webmasters.

9 Do occasional checks of your keywords

Occasionally checking the current search volume of your keywords is a good idea because shifts in search volumes are quite typical. Needless to say, if people no longer search for “car insurance London” because they have started using other search phrases and you continue to optimize for “car insurance London”, this is a waste of time and money. Also, keep an eye on the keywords your competitors use – this will give you a clue as to which keywords work and which don't.

10 Use social media

Social media can drive more traffic to a site than search engines and for local search this is also true. Facebook, Twitter, and the other social networking sites have a great sales potential because you can promote your business for free and reach exactly the people you need. Local groups on social sites are especially valuable because the participants there are mainly from the region you are interested in.

11 Ask for reviews and testimonials

Client reviews and testimonials are a classical business instrument and these are like letters of recommendation for your business. However, as far as SEO is concerned, they could have another role. There are review sites, where you can publish such reviews and testimonials (or ask your clients to do it) and this will drive business to you. Some of these sites are Yelp and Merchant Circle but it is quite probable that there are regional or national review sites you can also post at.

12 Create separate pages for your different locations

When you have business in several locations, this makes the task a bit more difficult because you can't possibly optimize for all of them – you can't have a keyphrase such as “car insurance London, Berlin, Paris, New York”. In this case the solution is to create separate pages for your different locations. If your locations span the globe, you can also create different sites on different country-specific domains (i.e. .co.uk for the UK, .de for Germany, etc.), but this is only reasonable if your business is truly multinational. Otherwise, a separate page for each of your locations will do.

These simple tips how to optimize your site for local searches are a must, if you rely on the local market. Maybe you are already doing some of them and you know what works for you and what doesn't. Anyway, if you haven't tried them all, try them now and see if this will have a positive impact on your rankings (and your business) or not.

Comments (0)